Efficient Convolutional Neural Networks


Learning Versatile Filters for Efficient Convolutional Neural Networks

Wang, Yunhe, Xu, Chang, Xu, Chunjing, Xu, Chao, Tao, Dacheng

Neural Information Processing Systems

This paper introduces versatile filters for constructing efficient convolutional neural networks. Considering the demands of efficient deep learning techniques running on cost-effective hardware, a number of methods have been developed to learn compact neural networks. Most of these works aim to slim down filters in different ways, e.g., by investigating small, sparse, or binarized filters. In contrast, we treat filters from an additive perspective: a series of secondary filters can be derived from a primary filter. These secondary filters all inherit from the primary filter without occupying more storage, but once unfolded during computation they can significantly enhance the capability of the filter by integrating information extracted from different receptive fields. Besides spatial versatile filters, we additionally investigate versatile filters from the channel perspective. The new techniques are general and can upgrade filters in existing CNNs. Experimental results on benchmark datasets and neural networks demonstrate that CNNs constructed with our versatile filters achieve accuracy comparable to that of the original filters, but require less memory and fewer FLOPs.
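One plausible reading of the spatial idea above can be sketched in plain NumPy. This is a hypothetical illustration, not the authors' implementation: it assumes the secondary filters are the nested central sub-windows of a stored primary filter, so several receptive-field sizes are served by one set of weights.

```python
import numpy as np

def secondary_filters(primary):
    """Derive secondary filters as nested central sub-windows of a
    d x d primary filter (hypothetical reading of the additive idea:
    each secondary filter covers a smaller receptive field but shares
    the primary filter's weights, so no extra storage is needed)."""
    d = primary.shape[0]
    filters = []
    # shrink the window by one pixel per side until it vanishes
    for s in range(d, 0, -2):
        off = (d - s) // 2
        filters.append(primary[off:off + s, off:off + s])
    return filters

primary = np.arange(25, dtype=float).reshape(5, 5)  # one 5x5 primary filter
secs = secondary_filters(primary)
print([f.shape for f in secs])  # three nested views of the same 25 weights
```

Note that each secondary filter is a view into the primary array, which mirrors the paper's claim that the secondary filters occupy no additional storage.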


ChannelNets: Compact and Efficient Convolutional Neural Networks via Channel-Wise Convolutions

Gao, Hongyang, Wang, Zhengyang, Ji, Shuiwang

Neural Information Processing Systems

Convolutional neural networks (CNNs) have shown great capability in solving various artificial intelligence tasks. However, their increasing model size has raised challenges in employing them in resource-limited applications. In this work, we propose to compress deep models by using channel-wise convolutions, which replace dense connections among feature maps with sparse ones in CNNs. Based on this novel operation, we build light-weight CNNs known as ChannelNets. Compared to prior CNNs designed for mobile devices, ChannelNets achieve a significant reduction in the number of parameters and computational cost without loss in accuracy.
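The sparse channel connection can be sketched in plain NumPy as a 1-D convolution sliding over the channel axis. This is a minimal illustration under stated assumptions (zero-padding on the channel axis, stride 1), not the authors' implementation:

```python
import numpy as np

def channelwise_conv(x, kernel):
    """1-D convolution over the channel axis of a (C, H, W) feature map.
    Each output channel mixes only a window of k input channels, instead
    of all C of them as a dense 1x1 convolution would.
    (Illustrative sketch; the padding and stride choices are assumptions.)"""
    c, h, w = x.shape
    k = kernel.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0), (0, 0)))  # pad the channel axis only
    out = np.empty_like(x)
    for i in range(c):
        # weighted sum over a k-channel window -> one output channel
        out[i] = np.tensordot(kernel, xp[i:i + k], axes=(0, 0))
    return out

x = np.random.rand(8, 4, 4)
y = channelwise_conv(x, np.array([0.25, 0.5, 0.25]))
print(y.shape)  # (8, 4, 4): same shape, but only k=3 weights vs. 8*8 dense
```

The parameter saving is the point: a dense 1x1 convolution over these 8 channels would need an 8x8 weight matrix, while the channel-wise version above shares a single length-3 kernel across all output channels.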


Reviews: ChannelNets: Compact and Efficient Convolutional Neural Networks via Channel-Wise Convolutions

Neural Information Processing Systems

The paper proposes channel-wise convolutions, which replace the dense connections between feature maps with sparse ones (based on 1-D convolutions). This reduces the number of parameters and FLOPs significantly while maintaining high accuracy. The authors show results on ImageNet classification and compare against VGG/MobileNet variants to demonstrate this. Strengths: the paper is well written and easy to follow. Background and related work, such as the standard convolutional and fully-connected layers used in neural nets, and the MobileNet and ShuffleNet variants that reduce computation, are described in sufficient detail.


Reviews: Learning Versatile Filters for Efficient Convolutional Neural Networks

Neural Information Processing Systems

The paper introduces two new types of convolutional filters, named versatile filters, which can reduce the memory and FLOP requirements of convolutional layers. The method is simple and, according to the experimental results, it seems to be effective. The text quality is OK, although it would definitely benefit from better explanations of the proposed method. For instance, Figure 1 is a bit confusing (are you showing 4 different filters in Fig. 1(b)?). My main concerns about this paper relate to the experiments and results, as detailed in the following questions: (1) Regarding the FLOP reduction, it is not clear how the reduction in the number of computations is actually achieved.
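For context on the reviewer's FLOP question, the standard way multiply-accumulate counts are tallied for a convolutional layer is a simple product of the output size, channel counts, and kernel area. The sketch below illustrates that accounting with hypothetical layer sizes (it is not the paper's specific analysis); the depthwise + pointwise split is shown only as a familiar example of where savings can come from:

```python
# Multiply-accumulate count for a convolutional layer producing an
# h x w output with c_out channels from c_in channels and a k x k kernel.
# (Standard accounting; the layer sizes below are hypothetical.)
def conv_flops(h, w, c_in, c_out, k):
    return h * w * c_in * c_out * k * k

full = conv_flops(56, 56, 128, 128, 3)       # ordinary 3x3 convolution
depthwise = conv_flops(56, 56, 1, 128, 3)    # one 3x3 filter per channel
pointwise = conv_flops(56, 56, 128, 128, 1)  # 1x1 channel mixing
print(full, depthwise + pointwise)  # the separable variant is ~8x cheaper
```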


Fruit and Vegetable identification system using efficient convolutional Neural Networks for…

#artificialintelligence

A neural network is formed from multiple neurons organized into multiple hidden layers. We use neural networks for automatic feature selection. Features are the variables that impact the predictions or outcomes. A layer consists of small individual units called neurons. Each neuron in the model computes a weighted sum of its inputs plus a bias and then passes the result on to the next neuron.
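The weights-and-bias computation described above can be sketched in a few lines. This is a minimal single-neuron example (the ReLU activation choice and the numbers are assumptions for illustration):

```python
import numpy as np

def neuron(x, w, b):
    """One neuron: a weighted sum of its inputs plus a bias, squashed by
    an activation (ReLU here); the result is passed to the next layer."""
    return max(0.0, float(np.dot(w, x) + b))

out = neuron(np.array([1.0, 2.0]), np.array([0.5, -0.25]), 0.1)
print(out)  # 0.5*1 + (-0.25)*2 + 0.1 = 0.1, and ReLU(0.1) = 0.1
```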


[R] [1707.01083] ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices • r/MachineLearning

@machinelearnbot

I am one of the authors of ShuffleNet. We've designed a new convolutional neural network structure for mobile platforms which utilizes pointwise group convolution and channel shuffle. Under a budget of 40 MFLOPs, we've achieved a 6.7% absolute top-1 error reduction on ImageNet classification compared to MobileNets. Empirically, our network with approximately the same error runs 13x faster than AlexNet on an ARM platform.
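The channel shuffle operation mentioned above can be sketched in a few lines of NumPy: reshape the channel axis into (groups, channels-per-group), transpose, and flatten, so that the next group convolution receives channels drawn from every group. A minimal sketch (not the authors' code):

```python
import numpy as np

def channel_shuffle(x, groups):
    """ShuffleNet-style channel shuffle for a (C, H, W) feature map:
    reshape channels to (groups, C // groups), transpose the two group
    axes, and flatten back, interleaving channels across groups."""
    c, h, w = x.shape
    return (x.reshape(groups, c // groups, h, w)
             .transpose(1, 0, 2, 3)
             .reshape(c, h, w))

x = np.arange(6).reshape(6, 1, 1)        # channels tagged 0..5
print(channel_shuffle(x, 2).ravel())     # [0 3 1 4 2 5]
```

With two groups of three channels, group 0 held channels {0, 1, 2} and group 1 held {3, 4, 5}; after shuffling, each half of the output contains channels from both groups, which is what lets stacked group convolutions exchange information.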